85 research outputs found
Free randomness can be amplified
Are there fundamentally random processes in nature? Theoretical predictions,
confirmed experimentally, such as the violation of Bell inequalities, point to
an affirmative answer. However, these results are based on the assumption that
measurement settings can be chosen freely at random, and thus presuppose the
existence of perfectly free random processes from the outset. Here we consider a scenario in
which this assumption is weakened and show that partially free random bits can
be amplified to make arbitrarily free ones. More precisely, given a source of
random bits whose correlation with other variables is below a certain
threshold, we propose a procedure for generating fresh random bits that are
virtually uncorrelated with all other variables. We also conjecture that such
procedures exist for any non-trivial threshold. Our result is based solely on
the no-signalling principle, which is necessary for the existence of free
randomness. Comment: 5+7 pages, 2 figures. Updated to match published version
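To make "partially free" concrete, here is a toy sketch of a biased bit source and classical XOR post-processing. As the comments note, this independent-bias model is strictly weaker than the adversarial Santha-Vazirani-style sources the paper treats; classical post-processing fails against those, which is why a protocol based on the no-signalling principle is needed.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "partially free" bit source: each bit is 1 with probability
# 1/2 + delta (bias delta), independently.  Real Santha-Vazirani
# sources are adversarial (the bias may depend on all earlier bits and
# on hidden variables); this independent model only illustrates what
# "correlation below a threshold" means.
delta = 0.2          # assumed bias of each raw bit
n_bits = 8           # bits XORed together per output bit
n_samples = 200_000

raw = rng.random((n_samples, n_bits)) < (0.5 + delta)
xor = raw.sum(axis=1) % 2            # XOR of each row of raw bits

bias_raw = abs(raw[:, 0].mean() - 0.5)
bias_xor = abs(xor.mean() - 0.5)

# Piling-up lemma: the XOR of n INDEPENDENT bits of bias delta has bias
# 2**(n-1) * delta**n, exponentially small in n.  Against adversarial
# SV sources no classical post-processing achieves this, hence the
# paper's quantum amplification procedure.
predicted = 2 ** (n_bits - 1) * delta ** n_bits
print(bias_raw, bias_xor, predicted)
```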
The Flat Transmission Spectrum of the Super-Earth GJ1214b from Wide Field Camera 3 on the Hubble Space Telescope
Capitalizing on the observational advantage offered by its tiny M dwarf host,
we present HST/WFC3 grism measurements of the transmission spectrum of the
super-Earth exoplanet GJ1214b. These are the first published WFC3 observations
of a transiting exoplanet atmosphere. After correcting for a ramp-like
instrumental systematic, we achieve nearly photon-limited precision in these
observations, finding the transmission spectrum of GJ1214b to be flat between
1.1 and 1.7 microns. Inconsistent with a cloud-free solar composition
atmosphere at 8.2 sigma, the measured achromatic transit depth most likely
implies a large mean molecular weight for GJ1214b's outer envelope. A dense
atmosphere rules out bulk compositions for GJ1214b that explain its large
radius by the presence of a very low-density gas layer surrounding the planet.
High-altitude clouds can alternatively explain the flat transmission spectrum,
but they would need to be optically thick up to 10 mbar or consist of particles
with a range of sizes approaching 1 micron in diameter. Comment: 17 pages, 12 figures, accepted for publication in Ap
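The link between mean molecular weight and a flat spectrum runs through the atmospheric scale height, H = kT/(μ m_u g): heavier atmospheres are more compact, so transmission features shrink. A rough numerical sketch, using assumed round literature values for GJ1214b rather than numbers from the paper:

```python
# Scale height H = k*T / (mu * m_u * g) for two candidate envelopes.
# T and g below are assumed illustrative values for GJ1214b.
K_B = 1.380649e-23    # Boltzmann constant, J/K
M_U = 1.66053907e-27  # atomic mass unit, kg

def scale_height_km(T_K, mu, g_ms2):
    """Atmospheric scale height in kilometres."""
    return K_B * T_K / (mu * M_U * g_ms2) / 1e3

T_eq = 550.0   # K, assumed equilibrium temperature
g = 8.9        # m/s^2, assumed surface gravity

H_h2 = scale_height_km(T_eq, 2.3, g)    # H/He-dominated (solar-like)
H_h2o = scale_height_km(T_eq, 18.0, g)  # water-dominated envelope

# The feature amplitude scales with H, so a steam atmosphere yields
# features ~8x smaller than a solar-composition one -- one route to
# the observed flat spectrum.
print(H_h2, H_h2o, H_h2 / H_h2o)
```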
Bell Correlations and the Common Future
Reichenbach's principle states that in a causal structure, correlations of
classical information can stem from a common cause in the common past or from a
direct influence of one of the correlated events on the other. The
difficulty of explaining Bell correlations through a mechanism in that spirit
can be read as questioning either the principle or even its basis: causality.
In the former case, the principle can be replaced by its quantum version,
accepting as a common cause an entangled state, leaving the phenomenon as
mysterious as ever on the classical level (on which, after all, it occurs). If,
more radically, the causal structure is questioned in principle, closed
space-time curves may become possible that, as is argued in the present note,
can give rise to non-local correlations if to-be-correlated pieces of classical
information meet in the common future --- which they need to if the correlation
is to be detected in the first place. The result is a view resembling Brassard
and Raymond-Robichaud's parallel-lives variant of Hermann's and Everett's
relative-state formalism, avoiding "multiple realities." Comment: 8 pages, 5 figures
Non-monotonic population and coherence evolution in Markovian open-system dynamics
We consider a simple microscopic model where the open-system dynamics of a
qubit, despite being Markovian, shows features which are typically associated
with the presence of memory effects. Namely, a non-monotonic behaviour in both
the population and the coherence evolution arises due to the presence of
non-secular contributions, which break the phase covariance of the Lindbladian
(semigroup) dynamics. We also show by an explicit construction how such a
non-monotonic behaviour can be reproduced by a phase covariant evolution, but
only at the price of inserting some state-dependent memory effects. Comment: Submitted to the proceedings of the 684. WE-Heraeus-Seminar "Advances in open systems and fundamental tests of quantum mechanics"
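A minimal numerical sketch of such behaviour, using a generic non-phase-covariant Lindbladian (coherent driving about x plus pure dephasing) chosen for illustration rather than the paper's specific microscopic model: the dynamics is an exact semigroup, hence Markovian, yet the ground-state population dips and partially revives.

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices and a toy Lindbladian that is NOT phase covariant.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

omega, gamma = 1.0, 0.1          # assumed driving and dephasing rates
H = 0.5 * omega * sx

# Column-stacked vectorisation: vec(A X B) = (B^T kron A) vec(X).
# drho/dt = -i[H, rho] + gamma (sz rho sz - rho)   (since sz^2 = I)
L = (-1j * (np.kron(I2, H) - np.kron(H.T, I2))
     + gamma * (np.kron(sz.T, sz) - np.eye(4)))

rho0 = np.array([[1, 0], [0, 0]], dtype=complex)   # start in |0><0|
ts = np.linspace(0.0, 12.0, 241)
p0 = np.array([
    (expm(L * t) @ rho0.reshape(-1, order="F"))
    .reshape(2, 2, order="F")[0, 0].real
    for t in ts
])

# The population first drops well below 1/2, then partially revives:
# non-monotonic despite exact Lindblad-semigroup (Markovian) dynamics.
print(p0.min(), p0[-1])
```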
Testing foundations of quantum mechanics with photons
The foundational ideas of quantum mechanics continue to give rise to
counterintuitive theories and physical effects that are in conflict with a
classical description of Nature. Experiments with light at the single photon
level have historically been at the forefront of tests of fundamental quantum
theory and new developments in photonics engineering continue to enable new
experiments. Here we review recent photonic experiments to test two
foundational themes in quantum mechanics: wave-particle duality, central to
recent complementarity and delayed-choice experiments; and Bell nonlocality
where recent theoretical and technological advances have allowed all
controversial loopholes to be separately addressed in different photonics
experiments. Comment: 10 pages, 5 figures, published as a Nature Physics Insight review article
Highly symmetric POVMs and their informational power
We discuss the dependence of the Shannon entropy of normalized finite rank-1
POVMs on the choice of the input state, looking for the states that minimize
this quantity. To distinguish the class of measurements where the problem can
be solved analytically, we introduce the notion of highly symmetric POVMs and
classify them in dimension two (for qubits). In this case we prove that the
entropy is minimal, and hence the relative entropy (informational power) is
maximal, if and only if the input state is orthogonal to one of the states
constituting a POVM. The method used in the proof, employing the Michel theory
of critical points for group action, the Hermite interpolation and the
structure of invariant polynomials for unitary-antiunitary groups, can also be
applied in higher dimensions and for other entropy-like functions. The links
between entropy minimization and entropic uncertainty relations, the Wehrl
entropy and the quantum dynamical entropy are described. Comment: 40 pages, 3 figures
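The qubit claim can be checked numerically on one instance. Below, the "trine" POVM (three equally spaced pure states on a Bloch great circle) serves as an assumed example of a highly symmetric rank-1 POVM; the sketch verifies that the outcome entropy is smaller for an input orthogonal to a POVM state than for one aligned with it. This checks a single case, not the paper's general proof.

```python
import numpy as np

def trine_povm():
    """Normalised rank-1 POVM E_k = (2/3)|psi_k><psi_k|, k = 0, 1, 2,
    with the |psi_k> equally spaced on a Bloch-sphere great circle."""
    povm = []
    for k in range(3):
        th = 2 * np.pi * k / 3
        psi = np.array([1.0, np.exp(1j * th)]) / np.sqrt(2)
        povm.append((2 / 3) * np.outer(psi, psi.conj()))
    return povm

def outcome_entropy(state, povm):
    """Shannon entropy (bits) of the outcome distribution p_k = tr(E_k rho)."""
    rho = np.outer(state, state.conj())
    p = np.array([np.trace(E @ rho).real for E in povm])
    p = p[p > 1e-12]                  # drop zero-probability outcomes
    return float(-(p * np.log2(p)).sum())

povm = trine_povm()
aligned = np.array([1.0, 1.0]) / np.sqrt(2)      # equal to psi_0
orthogonal = np.array([1.0, -1.0]) / np.sqrt(2)  # orthogonal to psi_0

H_aligned = outcome_entropy(aligned, povm)
H_orth = outcome_entropy(orthogonal, povm)
print(H_aligned, H_orth)   # the orthogonal input gives the smaller entropy
```

For the orthogonal input the distribution is (0, 1/2, 1/2), giving exactly 1 bit, below the roughly 1.25 bits of the aligned input, consistent with the minimisation claim.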
Different approaches and limitations for testing phytoplankton viability in natural assemblies and treated ballast water
Shipping is recognised as an unintentional but efficient pathway for spreading non-native species, harmful organisms and pathogens. In 2004, a unique IMO Convention was adopted to control and minimize this transfer in ships' ballast water. This Convention entered into force on 8th September 2017. However, unlike the majority of IMO Conventions, the Ballast Water Management Convention requires ships to comply with biological standards (e.g. concentration of organisms per unit volume in ballast water discharges). This study aimed to apply different techniques developed to measure concentrations of viable phytoplankton in natural and treated ballast water samples and to compare them with the established flow cytometry method and vital-staining microscopy. Samples were collected in the English Channel over one year and on board during ballast water shipboard efficacy tests. The natural abundance of live phytoplankton varied from 23% to 89% of the total, while for cells larger than 10 μm (a size class defined by the BWM Convention) the percentage varied from 3% to 60%. An overall good correlation was seen between the measurements taken with the two fluorometers and in comparison with the flow cytometry analysis, as found in previous studies. Analysis of treated ballast water samples showed a large variation in the number of viable cells, though indicating a low level of risk for regulatory purposes on all occasions. A key aspect to bear in mind when sampling and analysing for compliance is to be aware of the limitations of each technique.
Helical twisting power of chiral side-chain liquid crystal copolymers
Of course not, but if one believes that information cannot be destroyed in a theory of quantum gravity, then we run into apparent contradictions with quantum theory when we consider evaporating black holes: namely, that the no-cloning theorem or the principle of entanglement monogamy is violated. Here, we show that neither violation need hold, since, in arguing that black holes lead to cloning or non-monogamy, one needs to assume a tensor product structure between two points in space-time that could instead be viewed as causally connected. In the latter case, one is violating the semi-classical causal structure of space, which is a strictly weaker implication than cloning or non-monogamy, because both cloning and non-monogamy also lead to a breakdown of the semi-classical causal structure.
We show that the lack of monogamy that can emerge in evaporating space-times is one that is allowed in quantum mechanics, and is very naturally related to a lack of monogamy of correlations between outputs of measurements performed at subsequent instances of time on a single system. This is due to an interesting duality between temporal correlations and entanglement. A particular example of this is the Horowitz-Maldacena proposal, and we argue that it needn't lead to cloning or violations of entanglement monogamy. For measurements on systems which appear to be leaving a black hole, we introduce the notion of the temporal product, and argue that it is just as natural a choice for measurements as the tensor product. For black holes, the tensor and temporal products have the same measurement statistics but result in different types of non-monogamy of correlations, with the former being forbidden in quantum theory while the latter is allowed.
In the case of the AMPS firewall experiment we find that the entanglement structure is modified: one must have entanglement between the infalling Hawking partners and the early-time outgoing Hawking radiation, which surprisingly tames the violation of entanglement monogamy.
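The duality between temporal correlations and entanglement, and the non-monogamy of the former, can be illustrated with a standard two-time correlator calculation (a toy sketch under textbook assumptions, not the paper's construction): sequential projective spin measurements on a single maximally mixed qubit give E = n·m, so temporal CHSH reaches Tsirelson's bound 2√2; unlike spatially entangled parties, consecutive measurement pairs on the same system can each reach it, evading the spatial monogamy trade-off.

```python
import numpy as np

# Bloch vectors of the measured spin observables (n.sigma).
Z = np.array([0.0, 0.0, 1.0])
X = np.array([1.0, 0.0, 0.0])
Dp = (Z + X) / np.sqrt(2)
Dm = (Z - X) / np.sqrt(2)

def E_temporal(n, m):
    """Two-time correlator for sequential projective measurements of
    n.sigma then m.sigma on a qubit in the maximally mixed state:
    E[ab] = sum_{a,b} a*b * tr(P_b P_a (I/2) P_a) = n.m (textbook
    result; the same n.m also holds for later consecutive pairs)."""
    return float(n @ m)

def chsh(A0, A1, B0, B1):
    return (E_temporal(A0, B0) + E_temporal(A0, B1)
            + E_temporal(A1, B0) - E_temporal(A1, B1))

S_12 = chsh(Z, X, Dp, Dm)   # times 1 and 2: reaches Tsirelson's bound
S_23 = chsh(Dp, Dm, Z, X)   # times 2 and 3: reaches it again

# Spatially, monogamy forbids two overlapping CHSH pairs from both
# being at 2*sqrt(2) (S_AB^2 + S_BC^2 <= 8); temporally, both are.
print(S_12, S_23)
```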